
PyTorch parallelism #5916

Merged: 17 commits into Codecademy:main on Jan 21, 2025
Conversation

@andersooi (Contributor) commented on Jan 4, 2025

Description

  • Added new concept entry for PyTorch Distributed Data Parallelism
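The new entry covers Distributed Data Parallelism (DDP). As a rough illustration of the concept, here is a minimal, hedged sketch of DDP usage (not the entry's actual code): it initializes a one-process "gloo" group so the example can run on a single CPU machine, wraps a toy model in `DistributedDataParallel`, and performs one training step. The model, data, and hyperparameters are placeholders chosen for illustration only.

```python
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    # Single-process group for illustration; real DDP launches one
    # process per GPU (e.g. via torchrun) with a larger world_size.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)

    model = nn.Linear(10, 1)            # toy model (placeholder)
    ddp_model = DDP(model)              # gradients are all-reduced across ranks
    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    loss_fn = nn.MSELoss()

    inputs = torch.randn(4, 10)         # dummy batch; each rank would see its own shard
    targets = torch.randn(4, 1)

    optimizer.zero_grad()
    loss = loss_fn(ddp_model(inputs), targets)
    loss.backward()                     # DDP synchronizes gradients here
    optimizer.step()

    dist.destroy_process_group()
    return loss.item()


if __name__ == "__main__":
    main()
```

With more than one process, each rank would hold a replica of the model and a different shard of the data, and `backward()` would average gradients across ranks.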

Issue Solved

Closes #5871

Type of Change

  • Adding a new entry

Checklist

  • All writings are my own.
  • My entry follows the Codecademy Docs style guide.
  • My changes generate no new warnings.
  • I have performed a self-review of my own writing and code.
  • I have checked my entry and corrected any misspellings.
  • I have made corresponding changes to the documentation if needed.
  • I have confirmed my changes are not being pushed from my forked main branch.
  • I have confirmed that I'm pushing from a new branch named after the changes I'm making.
  • I have linked any issues that are relevant to this PR in the Issues Solved section.

@Radhika-okhade self-assigned this on Jan 7, 2025
@Radhika-okhade added the "status: under review" and "pytorch" labels on Jan 7, 2025
@Radhika-okhade (Collaborator) commented:

Hey @andersooi! Please correct the file path. The correct path is docs/content/pytorch/concepts/distributed-data-parallelism/distributed-data-parallelism.md.

@Radhika-okhade (Collaborator) commented:

Hey @andersooi, thank you for contributing to Codecademy Docs. I have made a few suggestions; please go through them and make the necessary changes.

@Radhika-okhade added the "status: waiting for author" label and removed the "status: under review" label on Jan 15, 2025
@Radhika-okhade (Collaborator) commented:

The entry looks good for a second round of review. @andersooi

@PragatiVerma18 (Collaborator) left a comment:

Hey @andersooi, thanks for contributing to docs. The entry looks good. I have a few suggestions. Please check them and make the required changes, and then we will be good to merge this PR. Let me know if you need any help.

@PragatiVerma18 (Collaborator) left a comment:

Hey @andersooi, this entry is now good to be merged! 🚀

@PragatiVerma18 PragatiVerma18 merged commit 6e8a776 into Codecademy:main Jan 21, 2025
7 checks passed

👋 @andersooi
You have contributed to Codecademy Docs, and we would like to know more about you and your experience.
Please take a minute to fill out this four-question survey to help us better understand Docs contributions and how we can improve the experience for you and our learners.
Thank you for your help!

🎉 Your contribution(s) can be seen here:

https://www.codecademy.com/resources/docs/pytorch/distributed-data-parallelism
https://github.com/Codecademy/docs/blob/main/documentation/tags.md

Please note it may take a little while for changes to become visible.
If you're appearing as anonymous and want to be credited, visit the linked accounts page and ensure that your GitHub account is linked.

Successfully merging this pull request may close these issues.

[Concept Entry] PyTorch: Distributed Data Parallelism
3 participants